Figures for "An adaptive algorithm for unsupervised learning"

This supplementary information presents:

  • first, the code to generate the figures from the paper,
  • second, some control experiments that were mentioned in the paper,
  • finally, some perspectives for future work inspired by the algorithms presented in the paper.
In [1]:
%matplotlib inline
In [2]:
%load_ext autoreload
%autoreload 2

A convenience script, model.py, allows one to run and cache most of the learning experiments in this notebook:

In [3]:
%run model.py
tag = HULK
n_jobs = 0
In [4]:
from shl_scripts.shl_experiments import SHL
shl = SHL(**opts)
data = shl.get_data(matname=tag)
In [5]:
shl?
Type:        SHL
String form: <shl_scripts.shl_experiments.SHL object at 0x132fcc250>
File:        ~/science/HULK/SparseHebbianLearning/shl_scripts/shl_experiments.py
Docstring:  
Base class to define SHL experiments:
    - initialization
    - coding and learning
    - visualization
    - quantitative analysis
In [6]:
print('# of pixels per patch =', shl.patch_width**2)
# of pixels per patch = 441
In [7]:
print('number of patches, size of patches = ', data.shape)
print('average of patches = ', data.mean(), ' +/- ', data.mean(axis=1).std())
SE = np.sqrt(np.mean(data**2, axis=1))
print('average energy of data = ', SE.mean(), '+/-', SE.std())
number of patches, size of patches =  (65520, 441)
average of patches =  -4.1888600727021664e-05  +/-  0.006270387629074682
average energy of data =  0.26082782604823146 +/- 0.07415089441760706
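The per-patch root-mean-square energy computed above can be reproduced on surrogate data; the array shape `(n_patches, n_pixels)` mirrors `data.shape`, but the Gaussian patches here are an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
# surrogate patches: zero-mean Gaussian pixels, roughly matching the data's scale
data = rng.normal(scale=0.26, size=(1000, 441))

# RMS energy of each patch, as in the cell above
SE = np.sqrt(np.mean(data**2, axis=1))
print('average energy =', SE.mean(), '+/-', SE.std())
```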
In [8]:
#!ls -l {shl.cache_dir}/
!ls -l {shl.cache_dir}/{tag}*
#!ls -ltr {shl.cache_dir}/{tag}*lock*
#!rm {shl.cache_dir}/{tag}*lock*
#!rm {shl.cache_dir}/{tag}*
#!rm {shl.cache_dir}/{tag}*HAP_seed*
#!ls -l {shl.cache_dir}/{tag}*
-rw-r--r--  1 laurentperrinet  staff    7589032 Aug 28 21:30 cache_dir/HULK - algorithm=elastic - homeo_method=HAP_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589033 Aug 26 15:36 cache_dir/HULK - algorithm=elastic - homeo_method=None_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589033 Aug 18 01:12 cache_dir/HULK - algorithm=elastic_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589029 Aug 28 12:40 cache_dir/HULK - algorithm=lars - homeo_method=HAP_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589030 Aug 26 06:46 cache_dir/HULK - algorithm=lars - homeo_method=None_dico.pkl
-rw-r--r--  1 laurentperrinet  staff          0 Aug 17 02:03 cache_dir/HULK - algorithm=lars_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff          0 Aug 17 02:03 cache_dir/HULK - algorithm=lars_dico.pkl_lock_pid-18082_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff    7589034 Aug 17 02:03 cache_dir/HULK - algorithm=lasso_cd_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589035 Aug 27 06:54 cache_dir/HULK - algorithm=lasso_lars - homeo_method=HAP_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589036 Aug 25 00:01 cache_dir/HULK - algorithm=lasso_lars - homeo_method=None_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589036 Aug 17 00:14 cache_dir/HULK - algorithm=lasso_lars_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 29 09:21 cache_dir/HULK - algorithm=mp - homeo_method=HAP_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 27 03:32 cache_dir/HULK - algorithm=mp - homeo_method=None_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 24 19:35 cache_dir/HULK - algorithm=mp_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 29 08:09 cache_dir/HULK - algorithm=omp - homeo_method=HAP_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589029 Aug 27 02:20 cache_dir/HULK - algorithm=omp - homeo_method=None_dico.pkl
-rw-r--r--  1 laurentperrinet  staff          0 Aug 18 01:12 cache_dir/HULK - algorithm=omp_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff          0 Aug 18 01:12 cache_dir/HULK - algorithm=omp_dico.pkl_lock_pid-20774_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:20 cache_dir/HULK_EMP_eta=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:22 cache_dir/HULK_EMP_eta=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 06:57 cache_dir/HULK_EMP_eta=0.00200_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:24 cache_dir/HULK_EMP_eta=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 06:57 cache_dir/HULK_EMP_eta=0.00356_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:26 cache_dir/HULK_EMP_eta=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta=0.00632_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:28 cache_dir/HULK_EMP_eta=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta=0.01125_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:30 cache_dir/HULK_EMP_eta=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:56 cache_dir/HULK_EMP_eta=0.02000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:32 cache_dir/HULK_EMP_eta=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta=0.03557_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:33 cache_dir/HULK_EMP_eta=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:57 cache_dir/HULK_EMP_eta=0.06325_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:35 cache_dir/HULK_EMP_eta=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta=0.11247_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:57 cache_dir/HULK_EMP_eta=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:56 cache_dir/HULK_EMP_eta_homeo=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:56 cache_dir/HULK_EMP_eta_homeo=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta_homeo=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:56 cache_dir/HULK_EMP_eta_homeo=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 08:58 cache_dir/HULK_EMP_eta_homeo=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 10:58 cache_dir/HULK_EMP_eta_homeo=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 10:56 cache_dir/HULK_EMP_eta_homeo=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 12:03 cache_dir/HULK_EMP_eta_homeo=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 13 12:02 cache_dir/HULK_EMP_eta_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 11:01 cache_dir/HULK_EMP_l0_sparseness=10_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 11:54 cache_dir/HULK_EMP_l0_sparseness=14_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 13:07 cache_dir/HULK_EMP_l0_sparseness=21_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 14:40 cache_dir/HULK_EMP_l0_sparseness=29_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 16:48 cache_dir/HULK_EMP_l0_sparseness=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 19:41 cache_dir/HULK_EMP_l0_sparseness=59_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 09:45 cache_dir/HULK_EMP_l0_sparseness=5_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 10:19 cache_dir/HULK_EMP_l0_sparseness=7_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  9 23:36 cache_dir/HULK_EMP_l0_sparseness=84_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   15144003 Sep 10 07:54 cache_dir/HULK_EMP_n_dictionary=1352_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    1922404 Sep  9 23:56 cache_dir/HULK_EMP_n_dictionary=169_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   21402563 Sep 10 12:15 cache_dir/HULK_EMP_n_dictionary=1912_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    2704724 Sep 10 00:24 cache_dir/HULK_EMP_n_dictionary=239_dico.pkl
-rw-r--r--  1 laurentperrinet  staff          0 Sep 10 12:15 cache_dir/HULK_EMP_n_dictionary=2704_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff          0 Sep 10 12:15 cache_dir/HULK_EMP_n_dictionary=2704_dico.pkl_lock_pid-47128_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff    3811539 Sep 10 00:54 cache_dir/HULK_EMP_n_dictionary=338_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    5376179 Sep 10 01:45 cache_dir/HULK_EMP_n_dictionary=478_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep 10 02:59 cache_dir/HULK_EMP_n_dictionary=676_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   10718307 Sep 10 05:01 cache_dir/HULK_EMP_n_dictionary=956_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:58 cache_dir/HULK_EMP_seed=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=43_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=44_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:58 cache_dir/HULK_EMP_seed=45_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:58 cache_dir/HULK_EMP_seed=46_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=47_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=48_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=49_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:58 cache_dir/HULK_EMP_seed=50_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 16:59 cache_dir/HULK_EMP_seed=51_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:37 cache_dir/HULK_HAP_eta=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:39 cache_dir/HULK_HAP_eta=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.00200_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:41 cache_dir/HULK_HAP_eta=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.00356_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:43 cache_dir/HULK_HAP_eta=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.00632_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:45 cache_dir/HULK_HAP_eta=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.01125_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:47 cache_dir/HULK_HAP_eta=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.02000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:49 cache_dir/HULK_HAP_eta=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.03557_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:51 cache_dir/HULK_HAP_eta=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:15 cache_dir/HULK_HAP_eta=0.06325_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:53 cache_dir/HULK_HAP_eta=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.11247_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Sep  4 22:16 cache_dir/HULK_HAP_eta_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 14:59 cache_dir/HULK_HAP_l0_sparseness=10_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 15:54 cache_dir/HULK_HAP_l0_sparseness=14_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 17:08 cache_dir/HULK_HAP_l0_sparseness=21_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 18:44 cache_dir/HULK_HAP_l0_sparseness=29_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 20:56 cache_dir/HULK_HAP_l0_sparseness=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 23:54 cache_dir/HULK_HAP_l0_sparseness=59_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 10 13:45 cache_dir/HULK_HAP_l0_sparseness=5_dico.pkl
-rw-r--r--  1 laurentperrinet  staff          0 Sep 10 13:45 cache_dir/HULK_HAP_l0_sparseness=7_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff          0 Sep 10 13:45 cache_dir/HULK_HAP_l0_sparseness=7_dico.pkl_lock_pid-58191_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 11 04:00 cache_dir/HULK_HAP_l0_sparseness=84_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    1921659 Sep 11 04:23 cache_dir/HULK_HAP_n_dictionary=169_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    2703979 Sep 11 04:55 cache_dir/HULK_HAP_n_dictionary=239_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    3810794 Sep 11 05:27 cache_dir/HULK_HAP_n_dictionary=338_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    5375434 Sep 11 06:22 cache_dir/HULK_HAP_n_dictionary=478_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7588282 Sep 11 07:38 cache_dir/HULK_HAP_n_dictionary=676_dico.pkl
-rw-r--r--  1 laurentperrinet  staff          0 Sep 11 07:38 cache_dir/HULK_HAP_n_dictionary=956_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff          0 Sep 11 07:38 cache_dir/HULK_HAP_n_dictionary=956_dico.pkl_lock_pid-58261_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=43_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=44_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 18:06 cache_dir/HULK_HAP_seed=45_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=46_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=47_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=48_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=49_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:39 cache_dir/HULK_HAP_seed=50_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Sep  4 13:38 cache_dir/HULK_HAP_seed=51_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 15 17:41 cache_dir/HULK_HEH_alpha_homeo=0.01250_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 15 17:50 cache_dir/HULK_HEH_alpha_homeo=0.01768_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 06:18 cache_dir/HULK_HEH_alpha_homeo=0.02500_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 06:37 cache_dir/HULK_HEH_alpha_homeo=0.03536_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 12:13 cache_dir/HULK_HEH_alpha_homeo=0.05000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 12:06 cache_dir/HULK_HEH_alpha_homeo=0.07071_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 11:50 cache_dir/HULK_HEH_alpha_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 11:44 cache_dir/HULK_HEH_alpha_homeo=0.14142_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 16 11:55 cache_dir/HULK_HEH_alpha_homeo=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:24 cache_dir/HULK_HEH_eta=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:30 cache_dir/HULK_HEH_eta=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 10 22:42 cache_dir/HULK_HEH_eta=0.00200_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:37 cache_dir/HULK_HEH_eta=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 10 22:33 cache_dir/HULK_HEH_eta=0.00356_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:44 cache_dir/HULK_HEH_eta=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 11 11:44 cache_dir/HULK_HEH_eta=0.00632_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:50 cache_dir/HULK_HEH_eta=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 11 11:53 cache_dir/HULK_HEH_eta=0.01125_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:57 cache_dir/HULK_HEH_eta=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 00:51 cache_dir/HULK_HEH_eta=0.02000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:04 cache_dir/HULK_HEH_eta=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 01:07 cache_dir/HULK_HEH_eta=0.03557_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:10 cache_dir/HULK_HEH_eta=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 12:33 cache_dir/HULK_HEH_eta=0.06325_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 23:17 cache_dir/HULK_HEH_eta=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 12:58 cache_dir/HULK_HEH_eta=0.11247_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:17 cache_dir/HULK_HEH_eta=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:24 cache_dir/HULK_HEH_eta_homeo=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:21 cache_dir/HULK_HEH_eta_homeo=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:38 cache_dir/HULK_HEH_eta_homeo=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:20 cache_dir/HULK_HEH_eta_homeo=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:19 cache_dir/HULK_HEH_eta_homeo=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:09 cache_dir/HULK_HEH_eta_homeo=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:15 cache_dir/HULK_HEH_eta_homeo=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:09 cache_dir/HULK_HEH_eta_homeo=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 22:03 cache_dir/HULK_HEH_eta_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 05:33 cache_dir/HULK_HEH_l0_sparseness=10_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 07:47 cache_dir/HULK_HEH_l0_sparseness=14_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 11:44 cache_dir/HULK_HEH_l0_sparseness=21_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 15:26 cache_dir/HULK_HEH_l0_sparseness=29_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 20:19 cache_dir/HULK_HEH_l0_sparseness=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 17 04:35 cache_dir/HULK_HEH_l0_sparseness=59_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 02:30 cache_dir/HULK_HEH_l0_sparseness=5_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 03:47 cache_dir/HULK_HEH_l0_sparseness=7_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 17 13:04 cache_dir/HULK_HEH_l0_sparseness=84_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   15144003 Aug 17 01:51 cache_dir/HULK_HEH_n_dictionary=1352_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    1922404 Aug 16 13:46 cache_dir/HULK_HEH_n_dictionary=169_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   21402563 Aug 17 05:20 cache_dir/HULK_HEH_n_dictionary=1912_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    2704724 Aug 16 14:36 cache_dir/HULK_HEH_n_dictionary=239_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   30253955 Aug 17 09:51 cache_dir/HULK_HEH_n_dictionary=2704_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    3811539 Aug 16 15:33 cache_dir/HULK_HEH_n_dictionary=338_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    5376179 Aug 16 16:52 cache_dir/HULK_HEH_n_dictionary=478_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 16 21:05 cache_dir/HULK_HEH_n_dictionary=676_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   10718307 Aug 16 22:57 cache_dir/HULK_HEH_n_dictionary=956_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:19 cache_dir/HULK_HEH_seed=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:18 cache_dir/HULK_HEH_seed=43_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:18 cache_dir/HULK_HEH_seed=44_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:20 cache_dir/HULK_HEH_seed=45_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:19 cache_dir/HULK_HEH_seed=46_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:13 cache_dir/HULK_HEH_seed=47_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:23 cache_dir/HULK_HEH_seed=48_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:36 cache_dir/HULK_HEH_seed=49_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:20 cache_dir/HULK_HEH_seed=50_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 11 07:17 cache_dir/HULK_HEH_seed=51_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:33 cache_dir/HULK_None_alpha_homeo=0.01250_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:34 cache_dir/HULK_None_alpha_homeo=0.01768_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:36 cache_dir/HULK_None_alpha_homeo=0.02500_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:36 cache_dir/HULK_None_alpha_homeo=0.03536_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:38 cache_dir/HULK_None_alpha_homeo=0.05000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:36 cache_dir/HULK_None_alpha_homeo=0.07071_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:36 cache_dir/HULK_None_alpha_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:36 cache_dir/HULK_None_alpha_homeo=0.14142_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 13 17:39 cache_dir/HULK_None_alpha_homeo=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:32 cache_dir/HULK_None_eta=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:35 cache_dir/HULK_None_eta=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.00200_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:37 cache_dir/HULK_None_eta=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.00356_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:39 cache_dir/HULK_None_eta=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.00632_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:41 cache_dir/HULK_None_eta=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.01125_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:43 cache_dir/HULK_None_eta=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.02000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:45 cache_dir/HULK_None_eta=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 21:59 cache_dir/HULK_None_eta=0.03557_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:47 cache_dir/HULK_None_eta=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.06325_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007963 Sep  5 21:49 cache_dir/HULK_None_eta=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 21:59 cache_dir/HULK_None_eta=0.11247_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 21:59 cache_dir/HULK_None_eta_homeo=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 21:58 cache_dir/HULK_None_eta_homeo=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 11 22:00 cache_dir/HULK_None_eta_homeo=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:33 cache_dir/HULK_None_eta_homeo=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:34 cache_dir/HULK_None_eta_homeo=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:33 cache_dir/HULK_None_eta_homeo=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:34 cache_dir/HULK_None_eta_homeo=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:34 cache_dir/HULK_None_eta_homeo=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589083 Aug 12 01:34 cache_dir/HULK_None_eta_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 15:25 cache_dir/HULK_None_l0_sparseness=10_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 18:45 cache_dir/HULK_None_l0_sparseness=14_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 20:24 cache_dir/HULK_None_l0_sparseness=21_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 22:14 cache_dir/HULK_None_l0_sparseness=29_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 23:55 cache_dir/HULK_None_l0_sparseness=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 14 01:49 cache_dir/HULK_None_l0_sparseness=59_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 14:26 cache_dir/HULK_None_l0_sparseness=5_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 14:50 cache_dir/HULK_None_l0_sparseness=7_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 14 06:15 cache_dir/HULK_None_l0_sparseness=84_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   15144004 Aug 14 09:45 cache_dir/HULK_None_n_dictionary=1352_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    1922405 Aug 13 22:00 cache_dir/HULK_None_n_dictionary=169_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   21402564 Aug 14 15:12 cache_dir/HULK_None_n_dictionary=1912_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    2704725 Aug 13 23:22 cache_dir/HULK_None_n_dictionary=239_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   30253956 Aug 14 23:48 cache_dir/HULK_None_n_dictionary=2704_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    3811540 Aug 13 21:24 cache_dir/HULK_None_n_dictionary=338_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    5376180 Aug 13 21:17 cache_dir/HULK_None_n_dictionary=478_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 13 22:42 cache_dir/HULK_None_n_dictionary=676_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   10718308 Aug 14 05:50 cache_dir/HULK_None_n_dictionary=956_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:14 cache_dir/HULK_None_seed=43_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:14 cache_dir/HULK_None_seed=44_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=45_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=46_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=47_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=48_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=49_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=50_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589028 Aug 10 12:16 cache_dir/HULK_None_seed=51_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 05:53 cache_dir/HULK_OLS_alpha_homeo=0.01250_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 05:52 cache_dir/HULK_OLS_alpha_homeo=0.01768_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 09:33 cache_dir/HULK_OLS_alpha_homeo=0.02500_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 09:35 cache_dir/HULK_OLS_alpha_homeo=0.03536_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 13:15 cache_dir/HULK_OLS_alpha_homeo=0.05000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 13:17 cache_dir/HULK_OLS_alpha_homeo=0.07071_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 17:00 cache_dir/HULK_OLS_alpha_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 17:02 cache_dir/HULK_OLS_alpha_homeo=0.14142_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 14 20:34 cache_dir/HULK_OLS_alpha_homeo=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 21:52 cache_dir/HULK_OLS_eta=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 21:54 cache_dir/HULK_OLS_eta=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta=0.00200_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 21:56 cache_dir/HULK_OLS_eta=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta=0.00356_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 21:58 cache_dir/HULK_OLS_eta=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta=0.00632_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:07 cache_dir/HULK_OLS_eta=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta=0.01125_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:11 cache_dir/HULK_OLS_eta=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta=0.02000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:13 cache_dir/HULK_OLS_eta=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta=0.03557_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:15 cache_dir/HULK_OLS_eta=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta=0.06325_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    6007962 Sep  5 22:17 cache_dir/HULK_OLS_eta=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta=0.11247_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta=0.20000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta_homeo=0.00100_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:30 cache_dir/HULK_OLS_eta_homeo=0.00178_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 06:29 cache_dir/HULK_OLS_eta_homeo=0.00316_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 09:59 cache_dir/HULK_OLS_eta_homeo=0.00562_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 09:59 cache_dir/HULK_OLS_eta_homeo=0.01000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 10:00 cache_dir/HULK_OLS_eta_homeo=0.01778_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 10:00 cache_dir/HULK_OLS_eta_homeo=0.03162_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 10:00 cache_dir/HULK_OLS_eta_homeo=0.05623_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589082 Aug 12 10:00 cache_dir/HULK_OLS_eta_homeo=0.10000_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 14 22:28 cache_dir/HULK_OLS_l0_sparseness=10_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 14 23:13 cache_dir/HULK_OLS_l0_sparseness=14_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 03:19 cache_dir/HULK_OLS_l0_sparseness=21_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 05:39 cache_dir/HULK_OLS_l0_sparseness=29_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 08:15 cache_dir/HULK_OLS_l0_sparseness=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 10:36 cache_dir/HULK_OLS_l0_sparseness=59_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 14 18:34 cache_dir/HULK_OLS_l0_sparseness=5_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 14 20:19 cache_dir/HULK_OLS_l0_sparseness=7_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 13:53 cache_dir/HULK_OLS_l0_sparseness=84_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   15144003 Aug 15 11:09 cache_dir/HULK_OLS_n_dictionary=1352_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    1922404 Aug 15 02:52 cache_dir/HULK_OLS_n_dictionary=169_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   21402563 Aug 15 14:51 cache_dir/HULK_OLS_n_dictionary=1912_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    2704724 Aug 15 03:58 cache_dir/HULK_OLS_n_dictionary=239_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   30253955 Aug 15 22:36 cache_dir/HULK_OLS_n_dictionary=2704_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    3811539 Aug 15 02:20 cache_dir/HULK_OLS_n_dictionary=338_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    5376179 Aug 15 03:38 cache_dir/HULK_OLS_n_dictionary=478_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 15 05:09 cache_dir/HULK_OLS_n_dictionary=676_dico.pkl
-rw-r--r--  1 laurentperrinet  staff   10718307 Aug 15 07:00 cache_dir/HULK_OLS_n_dictionary=956_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:27 cache_dir/HULK_OLS_seed=42_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:28 cache_dir/HULK_OLS_seed=43_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:29 cache_dir/HULK_OLS_seed=44_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:28 cache_dir/HULK_OLS_seed=45_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:30 cache_dir/HULK_OLS_seed=46_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:30 cache_dir/HULK_OLS_seed=47_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:28 cache_dir/HULK_OLS_seed=48_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:28 cache_dir/HULK_OLS_seed=49_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:30 cache_dir/HULK_OLS_seed=50_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 10 17:29 cache_dir/HULK_OLS_seed=51_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 29 11:47 cache_dir/HULK_OVF_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 29 10:34 cache_dir/HULK_WHITE_dico.pkl
-rw-r--r--  1 laurentperrinet  staff  231154688 Aug 10 07:18 cache_dir/HULK_data.npy
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 29 14:12 cache_dir/HULK_default_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 29 12:59 cache_dir/HULK_fixed_dico.pkl
-rw-r--r--  1 laurentperrinet  staff    7589027 Aug 13 15:27 cache_dir/HULK_vanilla_dico.pkl
In [9]:
!ls -ltr {shl.cache_dir}/{tag}*lock*
-rw-r--r--  1 laurentperrinet  staff  0 Aug 17 02:03 cache_dir/HULK - algorithm=lars_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff  0 Aug 17 02:03 cache_dir/HULK - algorithm=lars_dico.pkl_lock_pid-18082_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff  0 Aug 18 01:12 cache_dir/HULK - algorithm=omp_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff  0 Aug 18 01:12 cache_dir/HULK - algorithm=omp_dico.pkl_lock_pid-20774_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff  0 Sep 10 12:15 cache_dir/HULK_EMP_n_dictionary=2704_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff  0 Sep 10 12:15 cache_dir/HULK_EMP_n_dictionary=2704_dico.pkl_lock_pid-47128_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff  0 Sep 10 13:45 cache_dir/HULK_HAP_l0_sparseness=7_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff  0 Sep 10 13:45 cache_dir/HULK_HAP_l0_sparseness=7_dico.pkl_lock_pid-58191_host-fortytwo
-rw-r--r--  1 laurentperrinet  staff  0 Sep 11 07:38 cache_dir/HULK_HAP_n_dictionary=956_dico.pkl_lock
-rw-r--r--  1 laurentperrinet  staff  0 Sep 11 07:38 cache_dir/HULK_HAP_n_dictionary=956_dico.pkl_lock_pid-58261_host-fortytwo

figure 1: Role of homeostasis in learning sparse representations

In [10]:
fname = 'figure_map'
# we cross-validate with 10 different learning runs
#one_cv = 3 # and pick one to display intermediate results
one_cv = 8 # pick one run to display intermediate results

learning

The actual learning is done in a second object (here dico) from which we can access another set of properties and functions (see the shl_learn.py script):
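Before diving into the learned dictionaries, here is a minimal, self-contained sketch of the sanity checks printed below on `dico.dictionary`. The array `D` is a toy stand-in (random, then normalized), not a learned dictionary; only the shapes (676 filters of 21×21 = 441 pixels) and the unit-energy normalization mirror the real object:

```python
import numpy as np

# Toy stand-in for dico.dictionary: n_dictionary filters of
# patch_width**2 pixels each, normalized to unit L2 energy,
# as the printouts in this notebook verify for the learned one.
rng = np.random.default_rng(42)
D = rng.normal(size=(676, 441))
D /= np.sqrt(np.sum(D**2, axis=1, keepdims=True))

# per-filter energy: should be exactly 1 after normalization
SE = np.sqrt(np.sum(D**2, axis=1))
print('average energy of filters =', SE.mean(), '+/-', SE.std())
```

This is why the "average energy of filters" lines below always read `1.0 +/-` a value at machine precision.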

In [11]:
homeo_methods = ['None', 'OLS', 'HEH']

#list_figures = ['show_dico', 'time_plot_error', 'time_plot_logL', 'time_plot_MC', 'show_Pcum']
list_figures = []
dico = {}
for i_cv in range(N_cv):
    dico[i_cv] = {}
    for homeo_method in homeo_methods:
        shl = SHL(homeo_method=homeo_method, seed=seed+i_cv, **opts)
        dico[i_cv][homeo_method] = shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_seed=' + str(seed+i_cv))
In [12]:
list_figures = ['show_dico']
for i_cv in [one_cv]:
    for homeo_method in homeo_methods:
        print(hl + hs + homeo_method[:3] + hs + hl)
        shl = SHL(homeo_method=homeo_method, seed=seed+i_cv, **opts)
        shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_seed=' + str(seed+i_cv))

        print('size of dictionary = (number of filters, size of imagelets) = ', dico[i_cv][homeo_method].dictionary.shape)
        print('average of filters = ',  dico[i_cv][homeo_method].dictionary.mean(axis=1).mean(), 
              '+/-',  dico[i_cv][homeo_method].dictionary.mean(axis=1).std())
        SE = np.sqrt(np.sum(dico[i_cv][homeo_method].dictionary**2, axis=1))
        print('average energy of filters = ', SE.mean(), '+/-', SE.std())
        plt.show()
----------          Non          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  -7.038410893224767e-06 +/- 0.0008419021277793694
average energy of filters =  1.0 +/- 3.866729645080236e-17
----------          OLS          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  1.8871620517154972e-05 +/- 0.0007995474521857563
average energy of filters =  1.0 +/- 4.0734048673293375e-17
----------          HEH          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  -2.9411542274321333e-05 +/- 0.0008106530645520307
average energy of filters =  1.0 +/- 4.312578046109635e-17

panel A: plotting some dictionaries

In [13]:
pname = '/tmp/panel_A' #pname = fname + '_A'
In [14]:
from shl_scripts import show_dico
if DEBUG: show_dico(shl, dico[one_cv][homeo_method], data=data, dim_graph=(2,5))
In [15]:
dim_graph = (2, 9)
colors = ['black', 'orange', 'blue']
homeo_methods
Out[15]:
['None', 'OLS', 'HEH']
In [16]:
%run model.py
tag = HULK
n_jobs = 0
<Figure size 432x288 with 0 Axes>
In [17]:
subplotpars = dict(left=0.042, right=1., bottom=0., top=1., wspace=0.05, hspace=0.05,)
fig, axs = plt.subplots(3, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)

for ax, color, homeo_method in zip(axs.ravel(), colors, homeo_methods): 
    ax.axis(c=color, lw=2, axisbg='w')
    ax.set_facecolor('w')
    fig, ax = show_dico(shl, dico[one_cv][homeo_method], data=data, dim_graph=dim_graph, fig=fig, ax=ax)
    # ax.set_ylabel(homeo_method)
    ax.text(-10, 29, homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'

for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
In [18]:
### TODO put the p_min an p_max value in the filter map
In [19]:
if DEBUG: Image(pname +'.png')
In [20]:
if DEBUG: help(fig.subplots_adjust)
In [21]:
if DEBUG: help(plt.subplots)
In [22]:
if DEBUG: help(matplotlib.gridspec.GridSpec)

panel B: quantitative comparison

In [23]:
pname = '/tmp/panel_B' #fname + '_B'
In [24]:
Flim1, Flim2 = .475, .626
In [25]:
from shl_scripts import time_plot
variable = 'F'
alpha_0, alpha = .3, .15
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95)#, wspace=0.05, hspace=0.05,)
fig, ax = plt.subplots(1, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
for i_cv in range(N_cv):
    for color, homeo_method in zip(colors, homeo_methods): 
        ax.axis(c='b', lw=2, axisbg='w')
        ax.set_facecolor('w')
        if i_cv==0:
            fig, ax = time_plot(shl, dico[i_cv][homeo_method], variable=variable, unit='bits', color=color, label=homeo_method, alpha=alpha_0, fig=fig, ax=ax)
        else:
            fig, ax = time_plot(shl, dico[i_cv][homeo_method], variable=variable, unit='bits', color=color, alpha=alpha, fig=fig, ax=ax)        
        # ax.set_ylabel(homeo_method)
        #ax.text(-8, 7*dim_graph[0], homeo_method, fontsize=12, color='k', rotation=90)#, backgroundcolor='white'
ax.legend(loc='best')
ax.set_ylim(Flim1, Flim2)
for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
if DEBUG: Image(pname +'.png')

Montage of the subplots

In [26]:
import tikzmagic
In [27]:
%load_ext tikzmagic
In [28]:
#DEBUG = True
if DEBUG: help(tikzmagic)
%tikz \draw (0,0) rectangle (1,1);
In [29]:
%%tikz -f pdf --save {fname}.pdf
\draw[white, fill=white] (0.\linewidth,0) rectangle (1.\linewidth, .382\linewidth) ;
\draw [anchor=north west] (.0\linewidth, .382\linewidth) node {\includegraphics[width=.5\linewidth]{/tmp/panel_A}};
\draw [anchor=north west] (.5\linewidth, .382\linewidth) node {\includegraphics[width=.5\linewidth]{/tmp/panel_B}};
\begin{scope}[font=\bf\sffamily\large]
\draw [anchor=west,fill=white] (.0\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{A}$};
\draw [anchor=west,fill=white] (.53\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{B}$};
\end{scope}
In [30]:
!convert  -density {dpi_export} {fname}.pdf {fname}.jpg
!convert  -density {dpi_export} {fname}.pdf {fname}.png
#!convert  -density {dpi_export} -resize 5400  -units pixelsperinch -flatten  -compress lzw  -depth 8 {fname}.pdf {fname}.tiff
Image(fname +'.png')
Out[30]:
!echo "width=" ; convert {fname}.tiff -format "%[fx:w]" info:
!echo ", \nheight=" ; convert {fname}.tiff -format "%[fx:h]" info:
!echo ", \nunit=" ; convert {fname}.tiff -format "%U" info:
!identify {fname}.tiff

figure 2: Histogram Equalization Homeostasis

In [31]:
fname = 'figure_HEH'

First collecting data:

In [32]:
list_figures = ['show_Pcum']

dico = {}
for homeo_method in homeo_methods:
    print(hl + hs + homeo_method + hs + hl)
    shl = SHL(homeo_method=homeo_method, **opts)
    #dico[homeo_method] = shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_' + str(one_cv))
    dico[homeo_method] = shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_seed=' + str(seed+one_cv))
    plt.show()
----------          None          ----------
----------          OLS          ----------
----------          HEH          ----------
----------          HAP          ----------
----------          EMP          ----------
In [33]:
dico[homeo_method].P_cum.shape
Out[33]:
(676, 128)
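A minimal sketch of what a table of this shape could store, assuming (as the shape suggests) one quantized cumulative distribution of coefficient values per dictionary atom, over `nb_quant=128` bins. The Laplacian coefficients are toy data; this is an illustration of the data structure, not the `shl_scripts` implementation:

```python
import numpy as np

# toy sparse coefficients: one row of samples per atom (hypothetical data)
rng = np.random.default_rng(1)
coeffs = np.abs(rng.laplace(size=(676, 1000)))
nb_quant = 128

# one cumulative histogram per atom, quantized over nb_quant bins
edges = np.linspace(0., coeffs.max(), nb_quant + 1)
counts = np.stack([np.histogram(c, bins=edges)[0] for c in coeffs])
P_cum = np.cumsum(counts, axis=1) / counts.sum(axis=1, keepdims=True)
print(P_cum.shape)  # (676, 128)
```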

panel A: different P_cum

In [34]:
pname = '/tmp/panel_A' #pname = fname + '_A'

from shl_scripts import plot_P_cum
variable = 'F'
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95)#, wspace=0.05, hspace=0.05,)
fig, ax = plt.subplots(1, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
for color, homeo_method in zip(colors, homeo_methods): 
    ax.axis(c='b', lw=2, axisbg='w')
    ax.set_facecolor('w')
    fig, ax = plot_P_cum(dico[homeo_method].P_cum, ymin=0.93, ymax=1.001, 
                         title=None, suptitle=None, ylabel='non-linear functions', 
                         verbose=False, n_yticks=21, alpha=.02, c=color, fig=fig, ax=ax)
    ax.plot([0], [0], lw=1, color=color, label=homeo_method, alpha=.6)
    # ax.set_ylabel(homeo_method)
    #ax.text(-8, 7*dim_graph[0], homeo_method, fontsize=12, color='k', rotation=90)#, backgroundcolor='white'
ax.legend(loc='lower right')
for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
if DEBUG: Image(pname +'.png')
In [35]:
if DEBUG: help(fig.legend)

panel B: comparing the effects of parameters

In [36]:
pname = '/tmp/panel_B' #fname + '_B'
n_jobs = 1

from shl_scripts.shl_experiments import SHL_set
homeo_methods = ['None', 'OLS', 'HEH']
variables = ['eta', 'eta_homeo']
#latex_variables = [r'$\eta$', r'$\eta_\textnormal{homeo}$']

list_figures = []

for homeo_method in homeo_methods:
    opts_ = opts.copy()
    opts_.update(homeo_method=homeo_method)
    experiments = SHL_set(opts_, tag=tag + '_' + homeo_method, base=10)
    experiments.run(variables=variables, n_jobs=n_jobs, verbose=0)

import matplotlib.pyplot as plt
subplotpars = dict(left=0.2, right=.95, bottom=0.05, top=.95, wspace=0.5, hspace=0.6,)

x, y = .05, .8 #-.3

fig, axs = plt.subplots(len(variables), 1, figsize=(fig_width/2, fig_width/(1.3+phi)), gridspec_kw=subplotpars, sharey=True)

for i_ax, variable in enumerate(variables):
    for color, homeo_method in zip(colors, homeo_methods): 
        opts_ = opts.copy()
        opts_.update(homeo_method=homeo_method)
        experiments = SHL_set(opts_, tag=tag + '_' + homeo_method, base=10)
        fig, axs[i_ax] = experiments.scan(variable=variable, list_figures=[], display='final', fig=fig, ax=axs[i_ax], color=color, display_variable='F', verbose=0) #, label=homeo_metho
        #axs[i_ax].set_xlabel(latex_variables[i_ax]) #variable
        #axs[i_ax].text(x, y,  variable, transform=axs[i_ax].transAxes) 
        #axs[i_ax].get_xaxis().set_major_formatter(matplotlib.ticker.ScalarFormatter())
    axs[i_ax].set_ylim(Flim1, Flim2)

axs[0].xaxis.set_label_coords(0.5,-.325)        
#fig.legend(loc='lower right')
for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
if DEBUG: Image(pname +'.png')

Montage of the subplots

In [37]:
%%tikz -f pdf --save {fname}.pdf
\draw[white, fill=white] (0.\linewidth,0) rectangle (1.\linewidth, .382\linewidth) ;
\draw [anchor=north west] (.0\linewidth, .382\linewidth) node {\includegraphics[width=.5\linewidth]{/tmp/panel_A.pdf}};
\draw [anchor=north west] (.5\linewidth, .382\linewidth) node {\includegraphics[width=.465\linewidth]{/tmp/panel_B.pdf}};
\begin{scope}[font=\bf\sffamily\large]
\draw [anchor=west,fill=white] (.0\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{A}$};
\draw [anchor=west,fill=white] (.53\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{B}$};
\end{scope}
In [38]:
!convert  -density {dpi_export} {fname}.pdf {fname}.jpg
!convert  -density {dpi_export} {fname}.pdf {fname}.png
#!convert  -density {dpi_export} -resize 5400  -units pixelsperinch -flatten  -compress lzw  -depth 8 {fname}.pdf {fname}.tiff
Image(fname +'.png')
Out[38]:
!echo "width=" ; convert {fname}.tiff -format "%[fx:w]" info:
!echo ", \nheight=" ; convert {fname}.tiff -format "%[fx:h]" info:
!echo ", \nunit=" ; convert {fname}.tiff -format "%U" info:
!identify {fname}.tiff

figure 3: comparing homeostasis rules (OLS, HEH, EMP, HAP)

learning

In [39]:
fname = 'figure_HAP'
In [40]:
colors = ['orange', 'blue', 'red', 'green']
homeo_methods = ['OLS', 'HEH', 'EMP', 'HAP']
list_figures = []
dico = {}
for i_cv in range(N_cv):
    dico[i_cv] = {}
    for homeo_method in homeo_methods:
        shl = SHL(homeo_method=homeo_method, seed=seed+i_cv, **opts)
        dico[i_cv][homeo_method] = shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_seed=' + str(seed+i_cv))

list_figures = ['show_dico'] if DEBUG else []
for i_cv in [one_cv]:
    for homeo_method in homeo_methods:
        print(hl + hs + homeo_method + hs + hl)
        shl = SHL(homeo_method=homeo_method, seed=seed+i_cv, **opts)
        shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_' + homeo_method + '_seed=' + str(seed+i_cv))
        plt.show()
        print('size of dictionary = (number of filters, size of imagelets) = ', dico[i_cv][homeo_method].dictionary.shape)
        print('average of filters = ',  dico[i_cv][homeo_method].dictionary.mean(axis=1).mean(), 
              '+/-',  dico[i_cv][homeo_method].dictionary.mean(axis=1).std())
        SE = np.sqrt(np.sum(dico[i_cv][homeo_method].dictionary**2, axis=1))
        print('average energy of filters = ', SE.mean(), '+/-', SE.std())
----------          OLS          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  1.8871620517154972e-05 +/- 0.0007995474521857563
average energy of filters =  1.0 +/- 4.0734048673293375e-17
----------          HEH          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  -2.9411542274321333e-05 +/- 0.0008106530645520307
average energy of filters =  1.0 +/- 4.312578046109635e-17
----------          EMP          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  -4.87013859105051e-05 +/- 0.0008528193053604389
average energy of filters =  1.0 +/- 4.437606199346686e-17
----------          HAP          ----------
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  -2.1390427270893857e-05 +/- 0.0008757678879900148
average energy of filters =  1.0 +/- 4.333666573072855e-17

panel A: plotting some dictionaries

In [41]:
pname = '/tmp/panel_A' #pname = fname + '_A'
In [42]:
subplotpars = dict( left=0.042, right=1., bottom=0., top=1., wspace=0.05, hspace=0.05,)
fig, axs = plt.subplots(3, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)

for ax, color, homeo_method in zip(axs.ravel(), colors[1:], homeo_methods[1:]): 
    ax.axis(c=color, lw=2, axisbg='w')
    ax.set_facecolor('w')
    from shl_scripts import show_dico
    fig, ax = show_dico(shl, dico[one_cv][homeo_method], data=data, dim_graph=dim_graph, fig=fig, ax=ax)
    # ax.set_ylabel(homeo_method)
    ax.text(-10, 29, homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'

for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')

panel B: quantitative comparison

In [43]:
pname = '/tmp/panel_B' #fname + '_B'
In [44]:
from shl_scripts import time_plot
variable = 'F'
alpha = .3
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95)#, wspace=0.05, hspace=0.05,)
fig, ax = plt.subplots(1, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
for i_cv in range(N_cv):
    for color, homeo_method in zip(colors, homeo_methods): 
        ax.axis(c='b', lw=2, axisbg='w')
        ax.set_facecolor('w')
        if i_cv==0:
            fig, ax = time_plot(shl, dico[i_cv][homeo_method], variable=variable, unit='bits', color=color, label=homeo_method, alpha=alpha_0, fig=fig, ax=ax)
        else:
            fig, ax = time_plot(shl, dico[i_cv][homeo_method], variable=variable, unit='bits', color=color, alpha=alpha, fig=fig, ax=ax)        
ax.legend(loc='best')
ax.set_ylim(Flim1, Flim2)
for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
if DEBUG: Image(pname +'.png')    
In [45]:
if DEBUG: Image(pname +'.png')

Montage of the subplots

In [46]:
%%tikz -f pdf --save {fname}.pdf
\draw[white, fill=white] (0.\linewidth,0) rectangle (1.\linewidth, .382\linewidth) ;
\draw [anchor=north west] (.0\linewidth, .382\linewidth) node {\includegraphics[width=.5\linewidth]{/tmp/panel_A}};
\draw [anchor=north west] (.5\linewidth, .382\linewidth) node {\includegraphics[width=.5\linewidth]{/tmp/panel_B}};
\begin{scope}[font=\bf\sffamily\large]
\draw [anchor=west,fill=white] (.0\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{A}$};
\draw [anchor=west,fill=white] (.53\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{B}$};
\end{scope}
In [47]:
!convert  -density {dpi_export} {fname}.pdf {fname}.jpg
!convert  -density {dpi_export} {fname}.pdf {fname}.png
#!convert  -density {dpi_export} -resize 5400  -units pixelsperinch -flatten  -compress lzw  -depth 8 {fname}.pdf {fname}.tiff
Image(fname +'.png')
Out[47]:
!echo "width=" ; convert {fname}.tiff -format "%[fx:w]" info:
!echo ", \nheight=" ; convert {fname}.tiff -format "%[fx:h]" info:
!echo ", \nunit=" ; convert {fname}.tiff -format "%U" info:
!identify {fname}.tiff

As a control, we compare the methods for different parameters:

In [48]:
list_figures = []

for homeo_method in homeo_methods:
    opts_ = opts.copy()
    opts_.update(homeo_method=homeo_method)
    experiments = SHL_set(opts_, tag=tag + '_' + homeo_method, base=10)
    experiments.run(variables=variables, n_jobs=n_jobs, verbose=0)

import matplotlib.pyplot as plt
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95, wspace=0.5, hspace=0.35,)

x, y = .05, .8 #-.3
UP = 3
fig, axs = plt.subplots(len(variables), 1, figsize=(UP*fig_width/2, UP*fig_width/(1+phi)), gridspec_kw=subplotpars, sharey=True)

for i_ax, variable in enumerate(variables):
    for color, homeo_method in zip(colors, homeo_methods): 
        opts_ = opts.copy()
        opts_.update(homeo_method=homeo_method)
        experiments = SHL_set(opts_, tag=tag + '_' + homeo_method, base=10)
        fig, axs[i_ax] = experiments.scan(variable=variable, list_figures=[], display='final', fig=fig, ax=axs[i_ax], color=color, display_variable='F', verbose=0, label=homeo_method) #
        axs[i_ax].set_xlabel('') #variable
        axs[i_ax].text(x, y,  variable, transform=axs[i_ax].transAxes) 
        #axs[i_ax].get_xaxis().set_major_formatter(matplotlib.ticker.ScalarFormatter())

axs[0].set_ylim(Flim1, Flim2) # sharey=True propagates the limits to the other axes
fig.legend(loc='lower right')
Out[48]:
<matplotlib.legend.Legend at 0x1d16bb190>

figure 4: Convolutional Neural Network

In [49]:
fname = 'figure_CNN'
In [50]:
!rm -fr /tmp/database/Face_DataBase
!mkdir -p /tmp/database && rsync -a "/Users/laurentperrinet/science/VB_These/Rapport d'avancement/database/Face_DataBase" /tmp/database/
#!mkdir -p /tmp/database/ && rsync -a "/Users/laurentperrinet/science/VB_These/Rapport d'avancement/database/Face_DataBase/Raw_DataBase/*" /tmp/database/Face_DataBase
In [51]:
from CHAMP.DataLoader import LoadData
from CHAMP.DataTools import LocalContrastNormalization, FilterInputData, GenerateMask
from CHAMP.Monitor import DisplayDico, DisplayConvergenceCHAMP, DisplayWhere

import os
datapath = os.path.join("/tmp", "database")
path = os.path.join(datapath, "Face_DataBase/Raw_DataBase")
TrSet, TeSet = LoadData('Face', path, decorrelate=False, resize=(65, 65))

# MP Parameters
nb_dico = 20
width = 9
dico_size = (width, width)
l0 = 20
seed = 42
# Learning Parameters
eta = .05
nb_epoch = 500

TrSet, TeSet = LoadData('Face', path, decorrelate=False, resize=(65, 65))
N_TrSet, _, _, _ = LocalContrastNormalization(TrSet)
Filtered_L_TrSet = FilterInputData(
    N_TrSet, sigma=0.25, style='Custom', start_R=15)

mask = GenerateMask(full_size=(nb_dico, 1, width, width), sigma=0.8, style='Gaussian')

from CHAMP.CHAMP_Layer import CHAMP_Layer

from CHAMP.DataTools import SaveNetwork, LoadNetwork
homeo_methods = ['None', 'HAP']

for homeo_method, eta_homeo  in zip(homeo_methods, [0., 0.0025]):
    ffname = 'cache_dir_CNN/CHAMP_low_' + homeo_method + '.pkl'
    try:
        L1_mask = LoadNetwork(loading_path=ffname)
    except:
        L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                          dico_size=dico_size, mask=mask, verbose=1)
        dico_mask = L1_mask.TrainLayer(
            Filtered_L_TrSet, eta=eta, eta_homeo=eta_homeo, nb_epoch=nb_epoch, seed=seed)
        SaveNetwork(Network=L1_mask, saving_path=ffname)

panel A: plotting some dictionaries

In [52]:
pname = '/tmp/panel_A' #pname = fname + '_A'
subplotpars = dict(left=0.042, right=1., bottom=0., top=1., wspace=0.05, hspace=0.05,)
fig, axs = plt.subplots(2, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)

for ax, color, homeo_method in zip(axs.ravel(), ['black', 'green'], homeo_methods):
    ax.axis(c=color, lw=2, axisbg='w')
    ax.set_facecolor('w')
    ffname = 'cache_dir/CHAMP_low_' + homeo_method + '.pkl'
    L1_mask = LoadNetwork(loading_path=ffname)
    fig, ax = DisplayDico(L1_mask.dictionary, fig=fig, ax=ax)
    # ax.set_ylabel(homeo_method)
    ax.text(-8, 7*dim_graph[0], homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'

for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
In [53]:
subplotpars = dict(left=0.042, right=1., bottom=0., top=1., wspace=0.05, hspace=0.05,)

for color, homeo_method in zip(['black', 'green'], homeo_methods): 
    #fig, axs = plt.subplots(1, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
    ffname = 'cache_dir_CNN/CHAMP_low_' + homeo_method + '.pkl'
    L1_mask = LoadNetwork(loading_path=ffname)
    fig, ax = DisplayDico(L1_mask.dictionary)
    # ax.set_ylabel(homeo_method)
    #for ax in list(axs):
    #    ax.axis(c=color, lw=2, axisbg='w')
    #    ax.set_facecolor('w')
    ax[0].text(-5, 6, homeo_method, fontsize=8, color=color, rotation=90)#, backgroundcolor='white'
    plt.tight_layout( pad=0., w_pad=0., h_pad=.0)


    for ext in FORMATS: fig.savefig(pname + '_' + homeo_method + ext, dpi=dpi_export, bbox_inches='tight')
<Figure size 576x28.8 with 0 Axes>
<Figure size 576x28.8 with 0 Axes>

panel B: quantitative comparison

In [54]:
pname = '/tmp/panel_B' #fname + '_B'
from shl_scripts import time_plot
variable = 'F'
alpha = .3
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95)#, wspace=0.05, hspace=0.05,)
fig, axs = plt.subplots(2, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
for ax, color, homeo_method in zip(axs, ['black', 'green'], homeo_methods):
    print(ax, axs)
    ffname = 'cache_dir_CNN/CHAMP_low_' + homeo_method + '.pkl'
    L1_mask = LoadNetwork(loading_path=ffname)
    fig, ax = DisplayConvergenceCHAMP(L1_mask, to_display=['histo'], fig=fig, ax=ax)
    ax.axis(c=color, lw=2, axisbg='w')
    ax.set_facecolor('w')
    # ax.set_ylabel(homeo_method)
    #ax.text(-8, 7*dim_graph[0], homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'
for ext in FORMATS: fig.savefig(pname + ext, dpi=dpi_export, bbox_inches='tight')
if DEBUG: Image(pname +'.png')
In [55]:
from shl_scripts import time_plot
variable = 'F'
alpha = .3
subplotpars = dict(left=0.2, right=.95, bottom=0.2, top=.95)#, wspace=0.05, hspace=0.05,)

for color, homeo_method in zip(['black', 'green'], homeo_methods): 
    #fig, axs = plt.subplots(1, 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars)
    ffname = 'cache_dir_CNN/CHAMP_low_' + homeo_method + '.pkl'
    L1_mask = LoadNetwork(loading_path=ffname)
    fig, ax = DisplayConvergenceCHAMP(L1_mask, to_display=['histo'], color=color)
    ax.axis(c=color, lw=2, axisbg='w')
    ax.set_facecolor('w')
    ax.set_ylabel('counts')
    ax.set_xlabel('feature #')
    ax.set_ylim(0, 560)
    #ax.text(-8, 7*dim_graph[0], homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'
    #ax[0].text(-8, 3, homeo_method, fontsize=12, color=color, rotation=90)#, backgroundcolor='white'
    fig.suptitle(f'method={homeo_method}', y=1.15, fontsize=12)
    for ext in FORMATS: fig.savefig(pname + '_' + homeo_method + ext, dpi=dpi_export, bbox_inches='tight')
    if DEBUG: Image(pname +'.png')    

Montage of the subplots

In [56]:
%ls -ltr /tmp/panel_*
-rw-r--r--  1 laurentperrinet  wheel   73205 Sep 11 09:31 /tmp/panel_A.pdf
-rw-r--r--  1 laurentperrinet  wheel   86792 Sep 11 09:31 /tmp/panel_A.png
-rw-r--r--  1 laurentperrinet  wheel   49318 Sep 11 09:31 /tmp/panel_B.pdf
-rw-r--r--  1 laurentperrinet  wheel  519882 Sep 11 09:31 /tmp/panel_B.png
-rw-r--r--  1 laurentperrinet  wheel   27989 Sep 11 09:31 /tmp/panel_A_None.pdf
-rw-r--r--  1 laurentperrinet  wheel   21876 Sep 11 09:31 /tmp/panel_A_None.png
-rw-r--r--  1 laurentperrinet  wheel   29692 Sep 11 09:31 /tmp/panel_A_HAP.pdf
-rw-r--r--  1 laurentperrinet  wheel   19410 Sep 11 09:31 /tmp/panel_A_HAP.png
-rw-r--r--  1 laurentperrinet  wheel   10208 Sep 11 09:31 /tmp/panel_B_None.pdf
-rw-r--r--  1 laurentperrinet  wheel   64556 Sep 11 09:31 /tmp/panel_B_None.png
-rw-r--r--  1 laurentperrinet  wheel   10679 Sep 11 09:31 /tmp/panel_B_HAP.pdf
-rw-r--r--  1 laurentperrinet  wheel   64318 Sep 11 09:31 /tmp/panel_B_HAP.png
In [57]:
%%tikz -f pdf --save {fname}.pdf
\draw[white, fill=white] (0.\linewidth,0) rectangle (1.\linewidth, .382\linewidth) ;
\draw [anchor=north west] (.0\linewidth, .375\linewidth) node {\includegraphics[width=.95\linewidth]{/tmp/panel_A_None}};
\draw [anchor=north west] (.0\linewidth, .300\linewidth) node {\includegraphics[width=.95\linewidth]{/tmp/panel_A_HAP}};
\draw [anchor=north west] (.0\linewidth, .191\linewidth) node {\includegraphics[width=.45\linewidth]{/tmp/panel_B_None}};
\draw [anchor=north west] (.5\linewidth, .191\linewidth) node {\includegraphics[width=.45\linewidth]{/tmp/panel_B_HAP}};
\begin{scope}[font=\bf\sffamily\large]
%\draw [anchor=west,fill=white] (.0\linewidth, .382\linewidth) node [above right=-3mm] {$\mathsf{A}$};
\draw [anchor=west,fill=white] (.0\linewidth, .191\linewidth) node [above right=-3mm] {$\mathsf{A}$};
\draw [anchor=west,fill=white] (.53\linewidth, .191\linewidth) node [above right=-3mm] {$\mathsf{B}$};
\end{scope}
In [58]:
!convert  -density {dpi_export} {fname}.pdf {fname}.jpg
!convert  -density {dpi_export} {fname}.pdf {fname}.png
#!convert  -density {dpi_export} -resize 5400  -units pixelsperinch -flatten  -compress lzw  -depth 8 {fname}.pdf {fname}.tiff
Image(fname +'.png')
Out[58]:
!echo "width=" ; convert {fname}.tiff -format "%[fx:w]" info:
!echo ", \nheight=" ; convert {fname}.tiff -format "%[fx:h]" info:
!echo ", \nunit=" ; convert {fname}.tiff -format "%U" info:
!identify {fname}.tiff

coding

The learning itself is done via gradient descent, but it depends strongly on the coding / decoding algorithm. This is handled by a separate function (in the shl_encode.py script).
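As a minimal numpy sketch of the matching-pursuit coding step implied by `learning_algorithm='mp'` and the `l0_sparseness` parameter: greedily pick the unit-norm atom most correlated with the residual, subtract its projection, and repeat. Toy sizes; this is an illustration under those assumptions, not the shl_encode.py implementation:

```python
import numpy as np

def mp_code(x, D, l0_sparseness=5):
    # greedy matching pursuit: D has unit-norm atoms as rows
    code = np.zeros(D.shape[0])
    residual = x.copy()
    for _ in range(l0_sparseness):
        corr = D @ residual                 # correlation with each atom
        i = np.argmax(np.abs(corr))         # best-matching atom
        code[i] += corr[i]                  # accumulate its coefficient
        residual -= corr[i] * D[i]          # remove its projection
    return code

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 441))
D /= np.linalg.norm(D, axis=1, keepdims=True)
x = rng.normal(size=441)
a = mp_code(x, D)
x_rec = a @ D                               # decoding is linear synthesis
print('sparsity =', np.count_nonzero(a))
```

Decoding is just the linear synthesis `a @ D`, which is why the quality of the dictionary is measured through the reconstruction error of such sparse codes.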

Supplementary controls

starting a learning

In [59]:
shl = SHL(**opts)
list_figures = ['show_dico', 'show_Pcum', 'time_plot_F']
dico = shl.learn_dico(data=data, list_figures=list_figures, matname=tag + '_vanilla')
In [60]:
print('size of dictionary = (number of filters, size of imagelets) = ', dico.dictionary.shape)
print('average of filters = ',  dico.dictionary.mean(axis=1).mean(), 
      '+/-',  dico.dictionary.mean(axis=1).std())
SE = np.sqrt(np.sum(dico.dictionary**2, axis=1))
print('average energy of filters = ', SE.mean(), '+/-', SE.std())
size of dictionary = (number of filters, size of imagelets) =  (676, 441)
average of filters =  2.535468337218861e-05 +/- 0.0008451207321658774
average energy of filters =  1.0 +/- 3.959917221265013e-17

getting help

In [61]:
help(shl)
Help on SHL in module shl_scripts.shl_experiments object:

class SHL(builtins.object)
 |  SHL(height=256, width=256, patch_width=21, N_patches=65536, datapath='../database/', name_database='kodakdb', do_mask=True, do_bandpass=True, over_patches=16, patch_ds=1, n_dictionary=676, learning_algorithm='mp', fit_tol=None, l0_sparseness=21, alpha_MP=0.95, one_over_F=True, n_iter=4097, eta=0.02, beta1=0.99, beta2=0.99, epsilon=10, do_precision=False, eta_precision=0.0, homeo_method='HAP', eta_homeo=0.01, alpha_homeo=0.05, C=3.0, nb_quant=128, P_cum=None, do_sym=False, seed=42, patch_norm=False, batch_size=4096, record_each=32, record_num_batches=1024, n_image=None, DEBUG_DOWNSCALE=1, verbose=0, cache_dir='cache_dir')
 |  
 |  Base class to define SHL experiments:
 |      - initialization
 |      - coding and learning
 |      - visualization
 |      - quantitative analysis
 |  
 |  Methods defined here:
 |  
 |  __init__(self, height=256, width=256, patch_width=21, N_patches=65536, datapath='../database/', name_database='kodakdb', do_mask=True, do_bandpass=True, over_patches=16, patch_ds=1, n_dictionary=676, learning_algorithm='mp', fit_tol=None, l0_sparseness=21, alpha_MP=0.95, one_over_F=True, n_iter=4097, eta=0.02, beta1=0.99, beta2=0.99, epsilon=10, do_precision=False, eta_precision=0.0, homeo_method='HAP', eta_homeo=0.01, alpha_homeo=0.05, C=3.0, nb_quant=128, P_cum=None, do_sym=False, seed=42, patch_norm=False, batch_size=4096, record_each=32, record_num_batches=1024, n_image=None, DEBUG_DOWNSCALE=1, verbose=0, cache_dir='cache_dir')
 |      Initialize self.  See help(type(self)) for accurate signature.
 |  
 |  code(self, data, dico, coding_algorithm='mp', matname=None, P_cum=None, fit_tol=None, l0_sparseness=None, gain=None)
 |  
 |  decode(self, sparse_code, dico)
 |  
 |  get_data(self, matname=None, patch_width=None)
 |  
 |  learn_dico(self, dictionary=None, precision=None, P_cum=None, data=None, matname=None, record_each=None, folder_exp=None, list_figures=[], fig_kwargs={'fig': None, 'ax': None})
 |  
 |  plot_error(self, dico, **fig_kwargs)
 |  
 |  plot_variance(self, sparse_code, **fig_kwargs)
 |  
 |  plot_variance_histogram(self, sparse_code, **fig_kwargs)
 |  
 |  show_Pcum(self, dico, title=None, verbose=False, n_yticks=21, alpha=0.05, c='g', **fig_kwargs)
 |  
 |  show_dico(self, dico, data=None, title=None, **fig_kwargs)
 |  
 |  show_dico_in_order(self, dico, data=None, title=None, **fig_kwargs)
 |  
 |  time_plot(self, dico, variable='kurt', N_nosample=1, **fig_kwargs)
 |  
 |  ----------------------------------------------------------------------
 |  Data descriptors defined here:
 |  
 |  __dict__
 |      dictionary for instance variables (if defined)
 |  
 |  __weakref__
 |      list of weak references to the object (if defined)

In [62]:
help(dico)
Help on SparseHebbianLearning in module shl_scripts.shl_learn object:

class SparseHebbianLearning(builtins.object)
 |  SparseHebbianLearning(fit_algorithm, dictionary=None, precision=None, eta=0.003, beta1=0.9, beta2=0.999, epsilon=8, homeo_method='HEH', eta_homeo=0.05, alpha_homeo=0.0, C=5.0, nb_quant=256, P_cum=None, n_dictionary=None, n_iter=10000, batch_size=32, l0_sparseness=None, fit_tol=None, alpha_MP=1.0, do_precision=False, eta_precision=0.01, do_sym=False, record_each=200, record_num_batches=4096, verbose=False, one_over_F=True)
 |  
 |  Sparse Hebbian learning
 |  
 |  Finds a dictionary (a set of atoms) that can best be used to represent data
 |  using a sparse code.
 |  
 |  Parameters
 |  ----------
 |  
 |  n_dictionary : int,
 |      Number of dictionary elements to extract
 |  
 |  eta : float or dict
 |      Gives the learning parameter for the homeostatic gain.
 |  
 |  n_iter : int,
 |      total number of iterations to perform
 |  
 |  eta_homeo : float
 |      Gives the learning parameter for the homeostatic gain.
 |  
 |  alpha_homeo : float
 |      Gives the smoothing exponent  for the homeostatic gain
 |      If equal to 1 the homeostatic learning rule learns a linear relation to
 |      variance.
 |  
 |  dictionary : array of shape (n_dictionary, n_pixels),
 |      initial value of the dictionary for warm restart scenarios
 |      Use ``None`` for a new learning.
 |  
 |  fit_algorithm : {'mp', 'lars', 'cd'}
 |      see sparse_encode
 |  
 |  batch_size : int,
 |      The number of samples to take in each batch.
 |  
 |  l0_sparseness : int, ``0.1 * n_pixels`` by default
 |      Number of nonzero coefficients to target in each column of the
 |      solution. This is only used by `algorithm='lars'`, `algorithm='mp'`  and
 |      `algorithm='omp'`.
 |  
 |  fit_tol : float, 1. by default
 |      If `algorithm='lasso_lars'` or `algorithm='lasso_cd'`, `fit_tol` is the
 |      penalty applied to the L1 norm.
 |      If `algorithm='threshold'`, `fit_tol` is the absolute value of the
 |      threshold below which coefficients will be squashed to zero.
 |      If `algorithm='mp'` or `algorithm='omp'`, `fit_tol` is the tolerance
 |      parameter: the value of the reconstruction error targeted. In this case,
 |      it overrides `l0_sparseness`.
 |  
 |  verbose :
 |      degree of verbosity of the printed output
 |  
 |  Attributes
 |  ----------
 |  dictionary : array, [n_dictionary, n_pixels]
 |      dictionary extracted from the data
 |  
 |  
 |  Notes
 |  -----
 |  **References:**
 |  
 |  Olshausen BA, Field DJ (1996).
 |  Emergence of simple-cell receptive field properties by learning a sparse code for natural images.
 |  Nature, 381: 607-609. (http://redwood.berkeley.edu/bruno/papers/nature-paper.pdf)
 |  
 |  Olshausen BA, Field DJ (1997)
 |  Sparse Coding with an Overcomplete Basis Set: A Strategy Employed by V1?
 |  Vision Research, 37: 3311-3325.   (http://redwood.berkeley.edu/bruno/papers/VR.pdf)
 |  
 |  See also
 |  --------
 |  http://scikit-learn.org/stable/auto_examples/decomposition/plot_image_denoising.html
 |  
 |  Methods defined here:
 |  
 |  __init__(self, fit_algorithm, dictionary=None, precision=None, eta=0.003, beta1=0.9, beta2=0.999, epsilon=8, homeo_method='HEH', eta_homeo=0.05, alpha_homeo=0.0, C=5.0, nb_quant=256, P_cum=None, n_dictionary=None, n_iter=10000, batch_size=32, l0_sparseness=None, fit_tol=None, alpha_MP=1.0, do_precision=False, eta_precision=0.01, do_sym=False, record_each=200, record_num_batches=4096, verbose=False, one_over_F=True)
 |      Initialize self.  See help(type(self)) for accurate signature.
 |  
 |  fit(self, X, y=None)
 |      Fit the model from data in X.
 |      
 |      Parameters
 |      ----------
 |      X: array-like, shape (n_samples, n_pixels)
 |          Training vector, where n_samples in the number of samples
 |          and n_pixels is the number of features.
 |      
 |      Returns
 |      -------
 |      self : object
 |          Returns the instance itself.
 |  
 |  transform(self, X, algorithm=None, l0_sparseness=None, fit_tol=None, alpha_MP=None)
 |      Fit the model from data in X.
 |      
 |      Parameters
 |      ----------
 |      X: array-like, shape (n_samples, n_pixels)
 |          Training vector, where n_samples in the number of samples
 |          and n_pixels is the number of features.
 |      
 |      Returns
 |      -------
 |      self : object
 |          Returns sparse code.
 |  
 |  ----------------------------------------------------------------------
 |  Data descriptors defined here:
 |  
 |  __dict__
 |      dictionary for instance variables (if defined)
 |  
 |  __weakref__
 |      list of weak references to the object (if defined)

loading a database

Loading patches, with or without mask:

In [ ]:
N_patches = 12
from shl_scripts.shl_tools import show_data
opts_ = opts.copy()
opts_.update(verbose=0)
for i, (do_mask, label) in enumerate(zip([False, True], ['Without mask', 'With mask'])):
    data_ = SHL(DEBUG_DOWNSCALE=1, N_patches=N_patches, n_image=1, do_mask=do_mask, seed=seed, **opts_).get_data(matname=tag)
    fig, axs = show_data(data_)
    axs[0].set_ylabel(label)
    plt.show()

Testing different algorithms

In [ ]:
fig, ax = None, None
for homeo_method in ['None', 'HAP']:
    for algorithm in ['lasso_lars', 'lars', 'elastic', 'omp', 'mp']: # 'threshold', 'lasso_cd',
        opts_ = opts.copy()
        opts_.update(homeo_method=homeo_method, learning_algorithm=algorithm, verbose=0)
        shl = SHL(**opts_)
        dico = shl.learn_dico(data=data, list_figures=[],
                              matname=tag + ' - algorithm={}'.format(algorithm) + ' - homeo_method={}'.format(homeo_method))
        fig, ax = shl.time_plot(dico, variable='F', fig=fig, ax=ax, label=algorithm + '_' + homeo_method)
ax.legend()

Testing two different dictionary initialization strategies

White Noise Initialization + Learning

In [63]:
shl = SHL(one_over_F=False, **opts)
dico_w = shl.learn_dico(data=data, matname=tag + '_WHITE', list_figures=[])
shl = SHL(one_over_F=True, **opts)
dico_1oF = shl.learn_dico(data=data, matname=tag + '_OVF', list_figures=[])
fig_error, ax_error = None, None
fig_error, ax_error = shl.time_plot(dico_w, variable='F', fig=fig_error, ax=ax_error, color='blue', label='white noise')
fig_error, ax_error = shl.time_plot(dico_1oF, variable='F', fig=fig_error, ax=ax_error, color='red', label='one over f')
#ax_error.set_ylim((0, .65))
ax_error.legend(loc='best')
Out[63]:
<matplotlib.legend.Legend at 0x1cfe9ed90>

Testing two different learning rates strategies

By default, we use the ADAM strategy, see https://arxiv.org/pdf/1412.6980.pdf
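For reference, the ADAM update keeps running estimates of the first and second moments of the gradient. The sketch below is illustrative only (adam_update is a hypothetical name, not the shl_learn.py API); the default values mirror the SHL signature above, and setting beta1=0 (the "fixed" control below) reduces the first moment to the raw gradient:

```python
import numpy as np

def adam_update(D, grad, state, eta=0.02, beta1=0.99, beta2=0.99, epsilon=10):
    """One ADAM step on the dictionary, following the Hebbian ascent direction.

    state = (m, v, t): first moment, second moment, and iteration counter.
    """
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad            # first moment (momentum)
    v = beta2 * v + (1 - beta2) * grad**2         # second moment (per-coefficient scaling)
    m_hat = m / (1 - beta1**t)                    # bias correction
    v_hat = v / (1 - beta2**t)
    D = D + eta * m_hat / (np.sqrt(v_hat) + epsilon)
    return D, (m, v, t)
```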

In [64]:
shl = SHL(beta1=0., **opts)
dico_fixed = shl.learn_dico(data=data, matname=tag + '_fixed', list_figures=[])
shl = SHL(**opts)
dico_default = shl.learn_dico(data=data, matname=tag + '_default', list_figures=[])
fig_error, ax_error = None, None
fig_error, ax_error = shl.time_plot(dico_fixed, variable='F', fig=fig_error, ax=ax_error, color='blue', label='fixed')
fig_error, ax_error = shl.time_plot(dico_default, variable='F', fig=fig_error, ax=ax_error, color='red', label='ADAM')
#ax_error.set_ylim((0, .65))
ax_error.legend(loc='best')
Out[64]:
<matplotlib.legend.Legend at 0x209a287d0>

Testing different number of neurons and sparsity

As suggested by AnonReviewer3, we have tested how convergence is modified by changing the number of neurons. Comparing different numbers of neurons, we could re-draw the same convergence figures as in the paper. In addition, we checked that this result holds over a range of sparsity levels: in general, increasing the l0_sparseness parameter made convergence progressively slower. Importantly, in both cases the behavior did not depend on the kind of homeostasis heuristic chosen, demonstrating the generality of our results.

This is shown in the supplementary material added to our revision ("Testing different number of neurons and sparsity"). This extension demonstrates the originality of our work, as highlighted in point 4, and the generality of these results with respect to the parameters of the network.

In [65]:
#from shl_scripts.shl_experiments import SHL_set
#homeo_methods = ['None', 'OLS', 'HEH']
homeo_methods = ['None', 'EMP', 'HAP', 'HEH', 'OLS']

variables = ['l0_sparseness', 'n_dictionary']
list_figures = []

#n_dictionary=21**2

for homeo_method in homeo_methods:
    opts_ = opts.copy()
    opts_.update(homeo_method=homeo_method, datapath=datapath)
    experiments = SHL_set(opts_, tag=tag + '_' + homeo_method)
    experiments.run(variables=variables, n_jobs=1, verbose=0)

fig, axs = plt.subplots(len(variables), 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars, sharey=True)

for i_ax, variable in enumerate(variables):
    for color, homeo_method in zip(colors, homeo_methods): 
        opts_ = opts.copy()
        opts_.update(homeo_method=homeo_method, datapath=datapath)
        experiments = SHL_set(opts_, tag=tag + '_' + homeo_method)
        fig, axs[i_ax] = experiments.scan(variable=variable, list_figures=[], display='final', fig=fig, ax=axs[i_ax], color=color, display_variable='F', verbose=0) #, label=homeo_metho
        axs[i_ax].set_xlabel('') #variable
        axs[i_ax].text(.1, .8,  variable, transform=axs[i_ax].transAxes) 
        #axs[i_ax].get_xaxis().set_major_formatter(matplotlib.ticker.ScalarFormatter())
---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
<ipython-input-65-bf4963f6483f> in <module>
     12     opts_.update(homeo_method=homeo_method, datapath=datapath)
     13     experiments = SHL_set(opts_, tag=tag + '_' + homeo_method)
---> 14     experiments.run(variables=variables, n_jobs=1, verbose=0)
     15 
     16 fig, axs = plt.subplots(len(variables), 1, figsize=(fig_width/2, fig_width/(1+phi)), gridspec_kw=subplotpars, sharey=True)

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_experiments.py in run(self, N_scan, variables, n_jobs, list_figures, fig_kwargs, verbose)
    427             for variable, value in zip(variables_, values_):
    428                 shl = prun(variable, value, self.data, self.opts,
--> 429                             self.matname(variable, value), list_figures, fig_kwargs, verbose)
    430                 dico = shl.learn_dico(data=self.data,
    431                             matname=self.matname(variable, value),

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_experiments.py in prun(variable, value, data, opts, matname, list_figures, fig_kwargs, verbose)
    542         data = shl.get_data(**{variable:value})
    543     dico = shl.learn_dico(data=data, matname=matname,
--> 544                 list_figures=list_figures, fig_kwargs=fig_kwargs)
    545     return shl
    546 

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_experiments.py in learn_dico(self, dictionary, precision, P_cum, data, matname, record_each, folder_exp, list_figures, fig_kwargs)
    261 
    262                         dico = self.learn_dico(data=data, dictionary=dictionary, precision=precision, P_cum=P_cum,
--> 263                                                matname=None)
    264                         with open(fmatname, 'wb') as fp:
    265                             pickle.dump(dico, fp)

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_experiments.py in learn_dico(self, dictionary, precision, P_cum, data, matname, record_each, folder_exp, list_figures, fig_kwargs)
    241 
    242             if self.verbose: print('Training on %d patches' % len(data))
--> 243             dico.fit(data)
    244 
    245             if self.verbose:

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_learn.py in fit(self, X, y)
    154                                   record_each=self.record_each,
    155                                   record_num_batches=self.record_num_batches,
--> 156                                   verbose=self.verbose
    157                                   )
    158 

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_learn.py in dict_learning(X, dictionary, precision, eta, beta1, beta2, epsilon, homeo_method, eta_homeo, alpha_homeo, C, nb_quant, P_cum, n_dictionary, l0_sparseness, fit_tol, alpha_MP, do_precision, eta_precision, n_iter, one_over_F, batch_size, record_each, record_num_batches, verbose, method, do_sym)
    372                                     algorithm=method, fit_tol=fit_tol,
    373                                     P_cum=P_cum, C=C, do_sym=do_sym, l0_sparseness=l0_sparseness,
--> 374                                     gain=gain, alpha_MP=alpha_MP)
    375         residual = this_X - sparse_code @ dictionary
    376 

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_encode.py in sparse_encode(X, dictionary, precision, algorithm, fit_tol, P_cum, l0_sparseness, C, do_sym, verbose, gain, alpha_MP)
    125         sparse_code = mp(X, dictionary, precision, l0_sparseness=l0_sparseness,
    126                          fit_tol=fit_tol, P_cum=P_cum, C=C, do_sym=do_sym,
--> 127                          verbose=verbose, gain=gain, alpha_MP=alpha_MP)
    128     else:
    129         raise ValueError('Sparse coding method must be "mp", "lasso_lars" '

~/science/HULK/SparseHebbianLearning/shl_scripts/shl_encode.py in mp(X, dictionary, precision, l0_sparseness, fit_tol, alpha_MP, do_sym, P_cum, do_fast, C, verbose, gain)
    306             a_ind = corr[line, ind] / squared_norm[ind]**2 # size (K,)
    307             sparse_code[line, ind] += a_ind # size (K,)
--> 308             corr -=  a_ind[:, np.newaxis] * Xcorr[ind, :] # size (K, N)
    309 
    310     if verbose>0:

KeyboardInterrupt: 

Perspectives

Convolutional neural networks

In [66]:
from CHAMP.DataLoader import LoadData
from CHAMP.DataTools import LocalContrastNormalization, FilterInputData, GenerateMask
from CHAMP.Monitor import DisplayDico, DisplayConvergenceCHAMP, DisplayWhere

import os
home = os.getenv('HOME')
datapath = os.path.join("/tmp", "database")
path = os.path.join(datapath, "Raw_DataBase")
TrSet, TeSet = LoadData('Face', path, decorrelate=False, resize=(65, 65))
to_display = TrSet[0][0, 0:10, :, :, :]
print('Size=', TrSet[0].shape)
DisplayDico(to_display)
---------------------------------------------------------------------------
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-66-872b58de3d9d> in <module>
      7 datapath = os.path.join("/tmp", "database")
      8 path = os.path.join(datapath, "Raw_DataBase")
----> 9 TrSet, TeSet = LoadData('Face', path, decorrelate=False, resize=(65, 65))
     10 to_display = TrSet[0][0, 0:10, :, :, :]
     11 print('Size=', TrSet[0].shape)

~/pool/science/CHAMP/CHAMP/DataLoader.py in LoadData(name, data_path, decorrelate, avg_size, Grayscale, resize, GPU, download)
     68         if resize is None:
     69             resize = (65, 65)
---> 70         data_training = LoadFaceDB(data_path, size=resize, to_shuffle=True)
     71         data_testing = (data_training[0].clone(), data_training[1].clone())
     72 

~/pool/science/CHAMP/CHAMP/DataLoader.py in LoadFaceDB(path, size, to_shuffle)
     90     tensor_label = torch.LongTensor(1, batch_size)
     91     label = 0
---> 92     for each_dir in os.listdir(path):
     93         if each_dir != '.DS_Store':
     94             try:

FileNotFoundError: [Errno 2] No such file or directory: '/tmp/database/Raw_DataBase'

Training on a face database

In [ ]:
# MP Parameters
nb_dico = 20
width = 9
dico_size = (width, width)
l0 = 20
seed = 42
# Learning Parameters
eta = .05
nb_epoch = 500

TrSet, TeSet = LoadData('Face', path, decorrelate=False, resize=(65, 65))
N_TrSet, _, _, _ = LocalContrastNormalization(TrSet)
Filtered_L_TrSet = FilterInputData(
    N_TrSet, sigma=0.25, style='Custom', start_R=15)
to_display = Filtered_L_TrSet[0][0, 0:10, :, :, :]
DisplayDico(to_display)

mask = GenerateMask(full_size=(nb_dico, 1, width, width), sigma=0.8, style='Gaussian')
DisplayDico(mask)

Training the ConvMP Layer without homeostasis

In [ ]:
from CHAMP.CHAMP_Layer import CHAMP_Layer

from CHAMP.DataTools import SaveNetwork, LoadNetwork
fname = 'cache_dir_CNN/CHAMP_low_None.pkl'
try:
    L1_mask = LoadNetwork(loading_path=fname)
except:
    L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                      dico_size=dico_size, mask=mask, verbose=2)
    dico_mask = L1_mask.TrainLayer(
        Filtered_L_TrSet, eta=eta, nb_epoch=nb_epoch, seed=seed)
    SaveNetwork(Network=L1_mask, saving_path=fname)

DisplayDico(L1_mask.dictionary)
DisplayConvergenceCHAMP(L1_mask, to_display=['error', 'histo'])
DisplayWhere(L1_mask.where)

Training the ConvMP Layer with homeostasis

In [ ]:
fname = 'cache_dir_CNN/CHAMP_low_HAP.pkl'
try:
    L1_mask = LoadNetwork(loading_path=fname)
except:

    # Learning Parameters
    eta_homeo = 0.0025
    L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                          dico_size=dico_size, mask=mask, verbose=1)
    dico_mask = L1_mask.TrainLayer(
        Filtered_L_TrSet, eta=eta, eta_homeo=eta_homeo, nb_epoch=nb_epoch, seed=seed)
    SaveNetwork(Network=L1_mask, saving_path=fname)

DisplayDico(L1_mask.dictionary)
DisplayConvergenceCHAMP(L1_mask, to_display=['error'])
DisplayConvergenceCHAMP(L1_mask, to_display=['histo'])
DisplayWhere(L1_mask.where)

Reconstructing the input image

In [ ]:
from CHAMP.DataTools import Rebuilt
import torch
rebuilt_image = Rebuilt(torch.FloatTensor(L1_mask.code), L1_mask.dictionary)
DisplayDico(rebuilt_image[0:10, :, :, :]);

Training the ConvMP Layer with higher-level filters

We train higher-level feature vectors by forcing the network to:

  • learn bigger filters,
  • represent the information using a bigger dictionary (more atoms),
  • represent the information with fewer features (higher sparseness).
In [ ]:
fname = 'cache_dir_CNN/CHAMP_high_None.pkl'
try:
    L1_mask = LoadNetwork(loading_path=fname)
except:

    nb_dico = 60
    width = 19
    dico_size = (width, width)
    l0 = 5
    mask = GenerateMask(full_size=(nb_dico, 1, width, width), sigma=0.8, style='Gaussian')
    # Learning Parameters
    eta_homeo = 0.0
    eta = .05
    nb_epoch = 500
    # learn
    L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                          dico_size=dico_size, mask=mask, verbose=0)
    dico_mask = L1_mask.TrainLayer(
        Filtered_L_TrSet, eta=eta, eta_homeo=eta_homeo, nb_epoch=nb_epoch, seed=seed)
    SaveNetwork(Network=L1_mask, saving_path=fname)


DisplayDico(L1_mask.dictionary)
DisplayConvergenceCHAMP(L1_mask, to_display=['error'])
DisplayConvergenceCHAMP(L1_mask, to_display=['histo'])
DisplayWhere(L1_mask.where);
In [ ]:
fname = 'cache_dir_CNN/CHAMP_high_HAP.pkl'
try:
    L1_mask = LoadNetwork(loading_path=fname)
except:

    nb_dico = 60
    width = 19
    dico_size = (width, width)
    l0 = 5
    mask = GenerateMask(full_size=(nb_dico, 1, width, width), sigma=0.8, style='Gaussian')
    # Learning Parameters
    eta_homeo = 0.0025
    eta = .05
    nb_epoch = 500
    # learn
    L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                          dico_size=dico_size, mask=mask, verbose=0)
    dico_mask = L1_mask.TrainLayer(
        Filtered_L_TrSet, eta=eta, eta_homeo=eta_homeo, nb_epoch=nb_epoch, seed=seed)
    SaveNetwork(Network=L1_mask, saving_path=fname)

DisplayDico(L1_mask.dictionary)
DisplayConvergenceCHAMP(L1_mask, to_display=['error'])
DisplayConvergenceCHAMP(L1_mask, to_display=['histo'])
DisplayWhere(L1_mask.where);

Training on MNIST database

In [ ]:
fname = 'cache_dir_CNN/CHAMP_MNIST_HAP.pkl'
try:
    L1_mask = LoadNetwork(loading_path=fname)
except:
    path = os.path.join(datapath, "MNISTtorch")
    TrSet, TeSet = LoadData('MNIST', data_path=path)
    N_TrSet, _, _, _ = LocalContrastNormalization(TrSet)
    Filtered_L_TrSet = FilterInputData(
        N_TrSet, sigma=0.25, style='Custom', start_R=15)
    nb_dico = 60
    width = 7
    dico_size = (width, width)
    l0 = 15
    # Learning Parameters
    eta_homeo = 0.0025
    eta = .05
    nb_epoch = 500
    # learn
    L1_mask = CHAMP_Layer(l0_sparseness=l0, nb_dico=nb_dico,
                          dico_size=dico_size, mask=mask, verbose=2)
    dico_mask = L1_mask.TrainLayer(
        Filtered_L_TrSet, eta=eta, eta_homeo=eta_homeo, nb_epoch=nb_epoch, seed=seed)
    SaveNetwork(Network=L1_mask, saving_path=fname)

DisplayDico(L1_mask.dictionary)
DisplayConvergenceCHAMP(L1_mask, to_display=['error'])
DisplayConvergenceCHAMP(L1_mask, to_display=['histo'])
DisplayWhere(L1_mask.where);

Computational details

caching simulation data

In [ ]:
!ls -l {shl.cache_dir}/{tag}*
#!rm {shl.cache_dir}/{tag}*lock*
#!rm {shl.cache_dir}/{tag}*
#!ls -l {shl.cache_dir}/{tag}*
In [ ]:
%run model.py {tag} 0
%run model.py 35

Version used

In [ ]:
%load_ext watermark
%watermark -i -h -m -v -p numpy,matplotlib,shl_scripts

exporting the notebook

In [ ]:
!jupyter nbconvert --to html_embed Annex.ipynb --output=index.html
In [ ]:
#!jupyter-nbconvert --template report --to pdf Annex.ipynb
In [ ]:
#!pandoc Annex.html -o Annex.pdf
In [ ]:
#!/Applications/Chromium.app/Contents/MacOS/Chromium --headless --disable-gpu --print-to-pdf=Annex.pdf file:///tmp/Annex.html
In [ ]:
#!zip Annex.zip Annex.html

version control

In [ ]:
!git status
In [ ]:
!git pull
In [ ]:
!git commit -am' {tag} : re-running notebooks' 
In [ ]:
!git push
Done. Thanks for your attention!